Modelfile
Ollama Modelfile Syntax & Parameters
ollama create mymodel -f Modelfile
What: Create a custom model from a Modelfile.
Why: Lets you modify parameters, system prompts, and templates.
How: Write a Modelfile with your settings, then run this command.
Example:
FROM llama2:7b
PARAMETER temperature 0.7
SYSTEM "You are a helpful assistant."
ollama run mymodel
What: Runs your custom model.
Why: Test and interact with your modified model.
How: Use after creating the model.
Example:
ollama run mymodel
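ollama run also accepts a prompt directly on the command line for a one-off, non-interactive answer; a quick sketch (the prompt text is only an illustration):
ollama run mymodel "Explain what a Modelfile is in one sentence."
Without the quoted prompt, the command drops you into an interactive session.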
FROM llama2:7b
What: Defines the base model in the Modelfile.
Why: Every custom model must start from a base.
How: First line of your Modelfile.
Example:
FROM llama2:7b
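FROM can also point at a local weights file (for example a GGUF export) instead of a model from the Ollama library; a sketch, where ./my-model.gguf is a hypothetical local path:
FROM ./my-model.gguf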
PARAMETER temperature 0.7
What: Controls randomness of output.
Why: Adjusts creativity (0.0 = deterministic, 1.0 = very creative).
How: Add inside the Modelfile.
Example:
PARAMETER temperature 0.7
PARAMETER num_ctx 2048
What: Sets the maximum context window size in tokens.
Why: Lets the model keep more of the conversation in memory.
How: Inside the Modelfile.
Example:
PARAMETER num_ctx 2048
PARAMETER stop "###"
What: Defines a stop sequence.
Why: Stops generation when the marker is reached.
How: Inside the Modelfile.
Example:
PARAMETER stop "###"
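Several stop sequences can be set by repeating the line; a short sketch (the markers are only illustrations):
PARAMETER stop "###"
PARAMETER stop "User:"
Generation halts as soon as any one of them appears in the output.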
SYSTEM "You are a helpful assistant."
What: Defines the system prompt.
Why: Keeps tone and personality consistent.
How: Inside the Modelfile.
Example:
SYSTEM "You are a helpful assistant."
TEMPLATE """
User: {{ .Prompt }}
Assistant:
"""
What: Sets the structure of the prompt passed to the model.
Why: Enforces consistent input/output formatting.
How: Inside the Modelfile.
Example:
TEMPLATE """
User: {{ .Prompt }}
Assistant:
"""
LICENSE "MIT"
What: Adds license information.
Why: Clarifies usage rights.
How: Inside the Modelfile.
Example:
LICENSE "MIT"
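The full license text can also be embedded across multiple lines with triple quotes; a sketch using placeholder text:
LICENSE """
MIT License
Full license text goes here.
"""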
MESSAGE user "Where can I find the 2023 company FAQ?"
What: Adds a message to the model's built-in conversation history.
Why: Seeds example exchanges so the model answers in the expected style.
How: Inside the Modelfile; each line takes a role (system, user, or assistant) followed by the message text.
Example:
MESSAGE user "Where can I find the 2023 company FAQ?"
MESSAGE assistant "It is published on the internal support portal."
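Putting it all together, a complete Modelfile sketch that combines the directives above (parameter values and prompt wording are illustrative, not recommendations):
FROM llama2:7b
PARAMETER temperature 0.7
PARAMETER num_ctx 2048
PARAMETER stop "###"
SYSTEM "You are a helpful assistant."
TEMPLATE """
{{ .System }}
User: {{ .Prompt }}
Assistant:
"""
LICENSE "MIT"
Build and run it with:
ollama create mymodel -f Modelfile
ollama run mymodel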